Tree! I am no Tree! I am a Low Dimensional Hyperbolic Embedding

Neural Information Processing Systems

Note that we have d(z, w) = (y, z)_w if and only if d(z, w) = (x, z)_w. In Proceedings of the Twenty-Sixth Annual ACM Symposium on Principles of Distributed Computing (PODC '07), pages 43-52, New York, NY, USA, 2007.


HyperMiner: Topic Taxonomy Mining with Hyperbolic Embedding

Neural Information Processing Systems

Embedded topic models are able to learn interpretable topics even with large and heavy-tailed vocabularies. However, they generally assume a Euclidean embedding space, which fundamentally limits their ability to capture hierarchical relations. To this end, we present a novel framework that introduces hyperbolic embeddings to represent words and topics. With the tree-likeness property of hyperbolic space, the underlying semantic hierarchy among words and topics can be better exploited to mine more interpretable topics. Furthermore, owing to the superiority of hyperbolic geometry in representing hierarchical data, tree-structure knowledge can also be naturally injected to guide the learning of a topic hierarchy. We therefore develop a regularization term based on the idea of contrastive learning to inject prior structural knowledge efficiently. Experiments on both topic taxonomy discovery and document representation demonstrate that the proposed framework achieves improved performance over existing embedded topic models.
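The tree-likeness this abstract relies on can be made concrete with the standard Poincaré-ball distance (a textbook formula, not code from the paper): points pushed toward the boundary of the ball are exponentially far apart, which is what lets tree hierarchies embed with low distortion.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between points u and v in the Poincare ball (||x|| < 1)."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2)) + eps
    return np.arccosh(1.0 + 2.0 * sq / denom)

# A root word embedded at the origin and a leaf word pushed toward the
# boundary: the hyperbolic distance dominates the Euclidean one, leaving
# room for exponentially many leaves at a given depth.
root = np.array([0.0, 0.0])
leaf = np.array([0.9, 0.0])
```

In an embedded topic model, this distance would replace the Euclidean inner product when scoring word-topic affinity, so general topics can sit near the origin and specific words near the boundary.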


Hyperbolic Embeddings of Supervised Models

Neural Information Processing Systems

Models of hyperbolic geometry have been successfully used in ML, e.g. for embedding models in unsupervised learning. To our knowledge, there are no approaches that provide embeddings for supervised models, even though hyperbolic geometry provides convenient properties for expressing popular hypothesis classes such as decision trees (and ensembles). In this paper, we propose a full-fledged solution to the problem in three independent contributions. The second resolves an issue for a clean, unambiguous embedding of (ensembles of) decision trees in this model. The third shows how to smoothly tweak the Poincaré hyperbolic distance to improve its encoding and visualization properties near the border of the disk, a crucial region for our application, while keeping hyperbolicity. This last step has substantial independent interest, as it is grounded in a generalization of the Leibniz-Newton fundamental theorem of calculus.
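Why the border of the disk is a "crucial region" can be checked numerically: the same Euclidean gap costs vastly more hyperbolic distance near the boundary. This is a generic property of the Poincaré model, illustrated below, not the paper's specific tweak.

```python
import math

def poincare_dist_1d(u, v):
    """Poincare distance between two points on a diameter of the unit disk."""
    num = (u - v) ** 2
    den = (1.0 - u * u) * (1.0 - v * v)
    return math.acosh(1.0 + 2.0 * num / den)

# The same Euclidean gap of 0.009 is cheap near the center but very
# expensive near the boundary of the disk at radius 1.0.
near_center = poincare_dist_1d(0.100, 0.109)
near_border = poincare_dist_1d(0.990, 0.999)
```

This blow-up is exactly what makes naive visualization near the border unreadable, motivating a distance tweak that tames the region without losing hyperbolicity.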


Learning Structured Representations with Hyperbolic Embeddings

Neural Information Processing Systems

Most real-world datasets exhibit a natural hierarchy between classes or an inherent label structure that is either already available or can be constructed cheaply. However, most existing representation learning methods ignore this hierarchy, treating labels as permutation invariant. Recent work [Zeng et al., 2022] proposes using this structured information explicitly, but the use of Euclidean distance may distort the underlying semantic context [Chen et al., 2013]. In this work, motivated by the advantage of hyperbolic spaces in modeling hierarchical relationships, we propose a novel approach, HypStructure: a Hyperbolic Structured regularization approach to accurately embed the label hierarchy into the learned representations. HypStructure is a simple yet effective regularizer that consists of a hyperbolic tree-based representation loss along with a centering loss, and can be combined with any standard task loss to learn hierarchy-informed features.
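A loose sketch of the two-term recipe described here, with all names and details hypothetical (the paper's actual tree loss is more involved): pull each class centroid toward its parent's centroid under the hyperbolic distance, and keep the global centroid near the origin.

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Standard Poincare-ball geodesic distance (||u||, ||v|| < 1)."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2)) + eps
    return np.arccosh(1.0 + 2.0 * sq / denom)

def hypstructure_like_loss(feats, labels, parent, lam=0.1):
    """Toy regularizer: a tree term (child centroids near their parent's
    centroid in hyperbolic distance) plus a centering term (global
    centroid near the origin), weighted by lam."""
    centroids = {c: feats[labels == c].mean(axis=0) for c in set(labels)}
    tree = sum(poincare_dist(centroids[c], centroids[p])
               for c, p in parent.items() if p is not None)
    center = np.linalg.norm(np.mean(list(centroids.values()), axis=0))
    return tree + lam * center

# Three classes: class 0 is the root, classes 1 and 2 are its children.
rng = np.random.default_rng(0)
feats = 0.05 * rng.standard_normal((30, 2))
labels = np.repeat([0, 1, 2], 10)
parent = {0: None, 1: 0, 2: 0}
loss = hypstructure_like_loss(feats, labels, parent)
```

In practice such a term would be added to the task loss with a small weight, so the features stay discriminative while the class layout follows the label tree.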


Review-Based Cross-Domain Recommendation via Hyperbolic Embedding and Hierarchy-Aware Domain Disentanglement

Choi, Yoonhyuk

arXiv.org Artificial Intelligence

The issue of data sparsity poses a significant challenge to recommender systems. In response, algorithms that leverage side information such as review texts have been proposed. Furthermore, Cross-Domain Recommendation (CDR), which captures domain-shareable knowledge and transfers it from a richer domain (source) to a sparser one (target), has received notable attention. Nevertheless, the majority of existing methodologies assume a Euclidean embedding space, encountering difficulties in accurately representing richer text information and managing complex interactions between users and items. This paper advocates a hyperbolic CDR approach based on review texts for modeling user-item relationships. We first emphasize that conventional distance-based domain alignment techniques may cause problems because small modifications in hyperbolic geometry result in magnified perturbations, ultimately leading to the collapse of hierarchical structures. To address this challenge, we propose hierarchy-aware embedding and domain alignment schemes that adjust the scale to extract domain-shareable information without disrupting structural forms.

[Figure 1: The geometric properties of Euclidean (E, left) and hyperbolic (H, right) spaces. A hyperbolic space leverages the advantages of a wide space by placing nodes with high degrees close to the origin. However, common methods in recommender systems bring the relevant nodes closer, leading to a structural collapse in the hyperbolic geometry.]


Leveraging Hyperbolic Embeddings for Coarse-to-Fine Robot Design

Dong, Heng, Zhang, Junyu, Zhang, Chongjie

arXiv.org Artificial Intelligence

Multi-cellular robot design aims to create robots composed of numerous cells that can be efficiently controlled to perform diverse tasks. Previous research has demonstrated the ability to generate robots for various tasks, but these approaches often optimize robots directly in the vast design space, resulting in robots with complicated morphologies that are hard to control. In response, this paper presents a novel coarse-to-fine method for designing multi-cellular robots. Initially, this strategy seeks optimal coarse-grained robots and progressively refines them. To mitigate the challenge of determining the precise refinement juncture during the coarse-to-fine transition, we introduce the Hyperbolic Embeddings for Robot Design (HERD) framework. HERD unifies robots of various granularity within a shared hyperbolic space and leverages a refined Cross-Entropy Method for optimization. This framework enables our method to autonomously identify areas of exploration in hyperbolic space and concentrate on regions demonstrating promise. Finally, extensive empirical studies on various challenging tasks sourced from EvoGym show our approach's superior efficiency and generalization capability.
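The optimization backbone named here can be illustrated with the plain Euclidean Cross-Entropy Method skeleton below; HERD's refined variant operates over hyperbolic design embeddings, so this is only a generic sketch, not the paper's algorithm.

```python
import numpy as np

def cross_entropy_method(score, dim=2, iters=20, pop=64, n_elite=8, seed=0):
    """Plain Cross-Entropy Method: sample candidates from a Gaussian,
    keep the top-scoring elite, refit the Gaussian to them, and repeat.
    The search distribution concentrates on promising regions."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)
    for _ in range(iters):
        xs = rng.normal(mu, sigma, size=(pop, dim))
        elite = xs[np.argsort([score(x) for x in xs])[-n_elite:]]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu

# Maximizing -||x - (1, 2)||^2 drives the sampling mean toward (1, 2).
target = np.array([1.0, 2.0])
mu = cross_entropy_method(lambda x: -np.sum((x - target) ** 2))
```

Swapping the Gaussian's support for a hyperbolic embedding space is what lets the same elite-refitting loop range over robot designs of different granularity.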


Hyperbolic Embeddings for Learning Options in Hierarchical Reinforcement Learning

Tiwari, Saket, Prannoy, M.

arXiv.org Artificial Intelligence

Hierarchical reinforcement learning deals with the problem of breaking down large tasks into meaningful sub-tasks. Autonomous discovery of these sub-tasks has remained a challenging problem. We propose a novel method of learning sub-tasks by combining paradigms of routing in computer networks and graph-based skill discovery within the options framework to define meaningful sub-goals. We apply recent advances in learning embeddings via Riemannian optimisation in hyperbolic space to embed the state set and create a model of the environment. In doing so we enforce a global topology on the states and are able to exploit this topology to learn meaningful sub-tasks. We demonstrate empirically, in both discrete and continuous domains, how these embeddings can improve the learning of meaningful sub-tasks.
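The Riemannian optimisation referred to above typically amounts to rescaling the Euclidean gradient by the inverse of the Poincaré-ball metric before stepping, then projecting back inside the ball. A minimal single-step sketch (standard technique, not the paper's exact optimizer):

```python
import numpy as np

def riemannian_sgd_step(x, egrad, lr=0.01, max_norm=0.999):
    """One Riemannian SGD step on the Poincare ball: rescale the
    Euclidean gradient by the inverse metric factor (1 - ||x||^2)^2 / 4,
    take a gradient step, then clip back inside the unit ball."""
    scale = ((1.0 - np.sum(x ** 2)) ** 2) / 4.0
    x_new = x - lr * scale * egrad
    n = np.linalg.norm(x_new)
    if n >= max_norm:
        x_new = x_new * (max_norm / n)
    return x_new
```

Because the metric factor shrinks near the boundary, effective steps get smaller there, which keeps deeply embedded states (the fine-grained ones) stable during training.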